L2 Syntactic Complexity Analyzer - definition

L2 Syntactic Complexity Analyzer         
The L2 Syntactic Complexity Analyzer (L2SCA), developed by Xiaofei Lu at the Pennsylvania State University, is a computational tool that produces syntactic complexity indices for written English texts. Along with Coh-Metrix, the L2SCA is one of the most extensively used computational tools for computing indices of second language writing development.
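To make "syntactic complexity index" concrete, here is a minimal Python sketch of two very simple indices: mean length of sentence, and a naive clauses-per-sentence ratio. This is not the L2SCA itself, which relies on full syntactic parsing of the text; the regex-based sentence splitter and the clause-cue word list below are illustrative assumptions only.

import re

def mean_sentence_length(text: str) -> float:
    """Average number of word tokens per sentence."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not sentences:
        return 0.0
    return sum(len(s.split()) for s in sentences) / len(sentences)

def clauses_per_sentence(text: str) -> float:
    """Crude clause estimate: one main clause per sentence plus one per
    subordinating/coordinating cue word (a hypothetical heuristic)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not sentences:
        return 0.0
    cue = re.compile(r"\b(that|which|who|because|although|when|and|but)\b", re.I)
    return sum(1 + len(cue.findall(s)) for s in sentences) / len(sentences)

sample = "The tool analyzes texts. It reports indices that describe complexity."
print(mean_sentence_length(sample))   # 5.0 (words per sentence)
print(clauses_per_sentence(sample))   # 1.5 (clause cues per sentence)

Real indices of this kind (mean length of T-unit, clauses per T-unit, and so on) require a parser to identify the syntactic units reliably, which is what L2SCA automates.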
Computational complexity         
Measure of the amount of resources needed to run an algorithm or solve a computational problem.
Also known as: Asymptotic complexity; Computational Complexity; Bit complexity; Context of computational complexity; Complexity of computation (bit); Computational complexities.
In computer science, the computational complexity (or simply complexity) of an algorithm is the amount of resources required to run it. Particular focus is given to computation time (generally measured by the number of elementary operations needed) and to memory storage requirements.
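As a small illustration of the two resources named in this entry, the sketch below (hypothetical, not drawn from the source) tests a list for duplicates in two ways: a pair-by-pair scan using on the order of N^2 comparisons but constant extra memory, and a set-based scan using on the order of N operations but extra memory proportional to N.

def has_duplicates_quadratic(items: list) -> bool:
    # O(N^2) time, O(1) extra space: compare every pair of elements.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items: list) -> bool:
    # O(N) expected time, O(N) extra space: remember elements seen so far.
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

data = [3, 1, 4, 1, 5]
assert has_duplicates_quadratic(data) == has_duplicates_linear(data) == True

Which version is preferable depends on which resource is scarcer, which is exactly why complexity is stated separately for time and for space.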
complexity         
<algorithm> The level of difficulty in solving mathematically posed problems, as measured by the time, number of steps or arithmetic operations, or memory space required (called time complexity, computational complexity, and space complexity, respectively). The interesting aspect is usually how complexity scales with the size of the input (the "scalability"), where the size of the input is described by some number N. Thus an algorithm may have computational complexity O(N^2) (of the order of the square of the size of the input), in which case if the input doubles in size, the computation will take four times as many steps. The ideal is a constant-time algorithm (O(1)) or, failing that, O(N). See also NP-complete. (1994-10-20)
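The quadratic-scaling claim above is easy to check empirically. The sketch below (an illustrative stand-in, not part of the original entry) instruments a doubly nested loop with a step counter; doubling N quadruples the count, as the entry predicts for O(N^2).

def count_pair_steps(n: int) -> int:
    steps = 0
    for i in range(n):
        for j in range(n):
            steps += 1  # one elementary operation per inner iteration
    return steps

for n in (100, 200, 400):
    print(n, count_pair_steps(n))
# 100 10000
# 200 40000    <- input doubled, steps x4
# 400 160000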